Maximum Entropy and Minimal Mutual Information in a Nonlinear Model

Authors

  • Fabian J. Theis
  • Elmar W. Lang
Abstract

In blind source separation, two separation techniques are mainly used: minimal mutual information (MMI), where minimizing the mutual output information yields an independent random vector, and maximum entropy (ME), where the output entropy is maximized. It is, however, still unclear why ME should solve the separation problem, i.e. result in an independent vector. Amari gave a partial confirmation for ME in the linear case in [1], where he proved that, under the assumption of sources with vanishing expectation, ME does not change the solutions of MMI up to scaling and permutation. In this paper, we generalize Amari's approach to nonlinear ICA problems in which the random vectors have been mixed by output functions of layered neural networks. We show that certain solution points of MMI are kept fixed by ME if no scaling of the weight vectors is allowed. In general, however, ME might leave those MMI solutions via the diagonal weights in the first network layer. We therefore conclude by suggesting that, in nonlinear ME algorithms, the diagonal weights should be fixed in later epochs.
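To make the ME principle that the abstract contrasts with MMI concrete, the sketch below runs a standard Bell–Sejnowski Infomax (entropy-maximization) update with the natural gradient on a toy linear mixture. The mixing matrix, learning rate, iteration count, and logistic nonlinearity are illustrative assumptions for the demo, not the paper's nonlinear network construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent super-Gaussian (Laplacian) sources with zero mean
n = 5000
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])       # mixing matrix (assumed for the demo)
X = A @ S                        # observed linear mixtures

# ME / Infomax: maximize the entropy of y = sigmoid(W x).
# Natural-gradient update: dW = (I + (1 - 2y) u^T) W, with u = W x.
W = np.eye(2)
lr = 0.01
for _ in range(2000):
    U = W @ X
    Y = 1.0 / (1.0 + np.exp(-U))                         # logistic outputs
    W += lr * (np.eye(2) + (1.0 - 2.0 * Y) @ U.T / n) @ W

U = W @ X   # recovered sources, up to permutation and scaling
```

Because ME is blind to rescaling of each output (the indeterminacy the paper analyzes), the recovered components match the true sources only up to permutation and scale, which is why success is usually checked via correlations rather than exact values.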

Similar articles

Comparison of maximum entropy and minimal mutual information in a nonlinear setting

In blind source separation (BSS), two different separation techniques are mainly used: minimal mutual information (MMI), where minimization of the mutual output information yields an independent random vector, and maximum entropy (ME), where the output entropy is maximized. However, it is yet unclear why ME should solve the separation problem, i.e. result in an independent vector. Yang and Amari...


Minimal Models of Multidimensional Computations

The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output ...


Entropy/Length Profiles, Bounds on the Minimal Covering of Bipartite Graphs, and Trellis Complexity of Nonlinear Codes

In this paper, the trellis representation of nonlinear codes is studied from a new perspective. We introduce the new concept of the entropy/length profile (ELP). This profile can be considered an extension of the dimension/length profile (DLP) to nonlinear codes. This elaboration of the DLP, the entropy/length profile, appears to be suitable for the analysis of nonlinear codes. Additionally and ...


General Multilayer Perceptron Demixer Scheme for Nonlinear Blind Signal Separation

A new technique is presented for instantaneous blind signal separation from nonlinear mixtures using a general neural network based demixer scheme. The nonlinear demixer model follows directly from the general mixer model. Thus in the first part of the paper we present such a general mixer model which includes the linear mixtures as a special case. In the second part we present the general fram...


Blind Source Separation Using Maximum Entropy Pdf Estimation Based on Fractional Moments

Recovering a set of independent sources that have been linearly mixed is the main task of blind source separation. Utilizing different methods such as the infomax principle, mutual information, and maximum likelihood leads to simple iterative procedures such as natural gradient algorithms. These algorithms depend on a nonlinear function (known as a score or activation function) of the source distributions....



Journal title:

Volume   Issue

Pages  -

Publication date: 2001